Proximal methods for sparse optimal scoring and discriminant analysis

Authors

Abstract

Linear discriminant analysis (LDA) is a classical method for dimensionality reduction, where discriminant vectors are sought to project data to a lower dimensional space for optimal separability of classes. Several recent papers have outlined strategies, based on exploiting sparsity of the discriminant vectors, for performing LDA in the high-dimensional setting where the number of features exceeds the number of observations in the data. However, many of these proposed methods lack scalable methods for solving the underlying optimization problems. We consider an optimization scheme for solving the sparse optimal scoring formulation of LDA based on block coordinate descent. Each iteration of this algorithm requires an update of the scoring vector, which admits an analytic formula, and an update of the corresponding discriminant vector, which requires solution of a convex subproblem; we will propose several variants of this algorithm where the proximal gradient method or the alternating direction method of multipliers is used to solve this subproblem. We show that the per-iteration cost of these methods scales linearly in the dimension of the data provided restricted regularization terms are employed, and cubically in the dimension of the data in the worst case. Furthermore, we establish that when this block coordinate descent framework generates convergent subsequences of iterates, these subsequences converge to stationary points of the underlying optimization problem. We demonstrate the effectiveness of our new methods with empirical results for classification of Gaussian data and data sets drawn from benchmarking repositories, including time-series and multispectral X-ray data, and provide Matlab and R implementations of our optimization schemes.
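As an illustration of the block coordinate descent scheme described above, the following is a minimal sketch for a single discriminant direction, assuming an ℓ1 plus ridge penalty on the discriminant vector and a plain proximal-gradient (ISTA) inner solver. The specific update formulas and all names (sparse_optimal_scoring, lam, gam) are illustrative assumptions; they do not reproduce the authors' Matlab/R implementations.

```python
# Sketch (not the authors' code): block coordinate descent for one direction of
# sparse optimal scoring,
#   minimize_{theta, beta} (1/n)||Y theta - X beta||^2 + gam*||beta||^2 + lam*||beta||_1
#   subject to             (1/n) theta' Y'Y theta = 1,
# where Y is the n-by-K class indicator matrix.
import numpy as np

def soft_threshold(v, t):
    """Componentwise prox of t*||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def sparse_optimal_scoring(X, Y, lam=1e-2, gam=1e-3, outer_iters=50, inner_iters=200):
    n, p = X.shape
    D = (Y.T @ Y) / n                                   # diagonal class-proportion matrix
    beta = np.linalg.lstsq(X, Y[:, 0], rcond=None)[0]   # crude initialization
    # Step size: inverse Lipschitz constant of the smooth part of the subproblem.
    L = 2.0 * (np.linalg.norm(X, 2) ** 2 / n + gam)

    for _ in range(outer_iters):
        # Scoring-vector update: analytic, up to normalization in the D-norm.
        w = np.linalg.solve(D, Y.T @ (X @ beta) / n)
        theta = w / max(np.sqrt(w @ (D @ w)), 1e-12)

        # Discriminant-vector update: proximal gradient on the convex subproblem.
        target = Y @ theta
        for _ in range(inner_iters):
            grad = 2.0 * (X.T @ (X @ beta - target)) / n + 2.0 * gam * beta
            beta = soft_threshold(beta - grad / L, lam / L)
    return theta, beta

# Usage sketch: build Y from integer labels, then fit one direction.
# Y = np.eye(labels.max() + 1)[labels]; theta, beta = sparse_optimal_scoring(X, Y)
```

The abstract also mentions an alternating direction method of multipliers (ADMM) variant for the discriminant-vector subproblem and more general (restricted) regularization terms; only the simplest proximal-gradient variant is sketched here.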


Related articles

Sparse semiparametric discriminant analysis

In recent years, a considerable amount of work has been devoted to generalizing linear discriminant analysis to overcome its incompetence for high-dimensional classification (Witten and Tibshirani, 2011; Cai and Liu, 2011; Mai et al., 2012; Fan et al., 2012). In this paper, we develop high-dimensional sparse semiparametric discriminant analysis (SSDA) that generalizes the normal-theory discr...


Sparse Discriminant Analysis

Classification in high-dimensional feature spaces where interpretation and dimension reduction are of great importance is common in biological and medical applications. For these applications, standard methods such as microarrays, 1D NMR, and spectroscopy have become everyday tools for measuring thousands of features in samples of interest. The samples are often costly and therefore many problems...


Proximal Methods for Sparse Hierarchical Dictionary Learning

We propose to combine two approaches for modeling data admitting sparse representations: on the one hand, dictionary learning has proven effective for various signal processing tasks. On the other hand, recent work on structured sparsity provides a natural framework for modeling dependencies between dictionary elements. We thus consider a tree-structured sparse regularization to learn dictionar...
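A hypothetical sketch of the alternation this abstract alludes to: dictionary learning in which the sparse coding step uses proximal gradient with a pluggable structured-sparsity prox (for instance, a tree-structured prox like the one sketched under the next entry). All names and update rules below are illustrative and are not taken from the paper.

```python
# Hypothetical sketch: alternating dictionary learning with a structured prox
# in the coding step. `prox(V, t)` should implement the proximal operator of
# the chosen regularizer scaled by t (e.g. a tree-structured group norm).
import numpy as np

def learn_dictionary(X, n_atoms, prox, lam=0.1, n_iters=30):
    d, n = X.shape
    rng = np.random.default_rng(0)
    D = rng.standard_normal((d, n_atoms))
    D /= np.linalg.norm(D, axis=0, keepdims=True)        # unit-norm atoms
    A = np.zeros((n_atoms, n))                           # sparse codes

    for _ in range(n_iters):
        # Sparse coding: proximal gradient on 0.5*||X - D A||_F^2 + lam*Omega(A).
        L = np.linalg.norm(D, 2) ** 2                     # Lipschitz constant of the smooth part
        for _ in range(20):
            grad = D.T @ (D @ A - X)
            A = prox(A - grad / L, lam / L)

        # Dictionary update: least squares, then renormalize the columns.
        G = A @ A.T + 1e-8 * np.eye(n_atoms)
        D = (X @ A.T) @ np.linalg.inv(G)
        D /= np.maximum(np.linalg.norm(D, axis=0, keepdims=True), 1e-12)
    return D, A
```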


Proximal Methods for Hierarchical Sparse Coding

Sparse coding consists in representing signals as sparse linear combinations of atoms selected from a dictionary. We consider an extension of this framework where the atoms are further assumed to be embedded in a tree. This is achieved using a recently introduced tree-structured sparse regularization norm, which has proven useful in several applications. This norm leads to regularized problems ...
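To make the group-thresholding idea concrete: for nested groups (each tree node together with all of its descendants) and ℓ2 group norms, the proximal operator of the tree-structured norm can be computed by a single pass of group soft-thresholding from the leaves toward the root. The sketch below is an illustrative reading of that construction with a made-up three-variable chain as the tree; it is not the paper's implementation.

```python
# Illustrative prox of a tree-structured norm  Omega(w) = sum_g weight_g * ||w_g||_2,
# where each group g is a node of the tree together with its descendants.
import numpy as np

def group_soft_threshold(w, idx, t):
    """Shrink the ell_2 norm of w[idx] by t (prox of t*||w_idx||_2)."""
    norm = np.linalg.norm(w[idx])
    scale = max(0.0, 1.0 - t / norm) if norm > 0 else 0.0
    w[idx] = scale * w[idx]
    return w

def tree_prox(u, groups, t):
    """Prox of t*Omega at u.

    groups : list of (index_list, weight) pairs ordered so that every group
             appears before any group containing it (leaves to root).
    """
    w = u.copy()
    for idx, weight in groups:
        w = group_soft_threshold(w, np.asarray(idx), t * weight)
    return w

# Made-up example: variables 0..2 on a chain; node 2 -> {2}, node 1 -> {1, 2},
# root -> {0, 1, 2}.
groups = [([2], 1.0), ([1, 2], 1.0), ([0, 1, 2], 1.0)]
print(tree_prox(np.array([3.0, 0.5, 4.0]), groups, t=0.5))
```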


Sparse Uncorrelated Linear Discriminant Analysis

In this paper, we develop a novel approach for sparse uncorrelated linear discriminant analysis (ULDA). Our proposal is based on a characterization of all solutions of the generalized ULDA. We incorporate sparsity into the ULDA transformation by seeking the solution with minimum ℓ1-norm from all minimum-dimension solutions of the generalized ULDA. The problem is then formulated as an ℓ1-minimizati...
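The ℓ1-minimization formulation mentioned here can be illustrated in generic form by the standard recast of "minimize ||x||_1 subject to Ax = b" as a linear program. In the sketch below, A and b are placeholders only; they do not encode the ULDA solution-set characterization, which the excerpt does not give.

```python
# Generic ell_1-minimization via a linear program:
#   minimize ||x||_1  s.t.  A x = b,  using the split x = xp - xn, xp, xn >= 0.
import numpy as np
from scipy.optimize import linprog

def min_l1_solution(A, b):
    m, n = A.shape
    c = np.ones(2 * n)                       # objective: sum(xp) + sum(xn) = ||x||_1
    A_eq = np.hstack([A, -A])                # A xp - A xn = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n), method="highs")
    if not res.success:
        raise RuntimeError(res.message)
    return res.x[:n] - res.x[n:]

# Tiny underdetermined example: the ell_1-minimal solution is the sparse one.
A = np.array([[1.0, 1.0, 0.0], [0.0, 1.0, 1.0]])
b = np.array([1.0, 1.0])
print(min_l1_solution(A, b))                 # expect approximately [0, 1, 0]
```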



Journal

Journal title: Advances in Data Analysis and Classification

Year: 2022

ISSN: 1862-5355, 1862-5347

DOI: https://doi.org/10.1007/s11634-022-00530-6